991.
Direct replay of the experience of a user in a virtual environment is difficult for others to watch due to unnatural camera motions. We present methods for replaying and summarizing these egocentric experiences that effectively communicate the user's observations while reducing unwanted camera movements. Our approach summarizes the viewpoint path as a concise sequence of viewpoints that cover the same parts of the scene. The core of our approach is a novel content-dependent metric that can be used to identify similarities between viewpoints. This enables viewpoints to be grouped by similar contextual view information and provides a means to generate novel viewpoints that can encapsulate a series of views. These resulting encapsulated viewpoints are used to synthesize new camera paths that convey the content of the original viewer's experience. Projecting the initial movement of the user back onto the scene can be used to convey the details of their observations, and the extracted viewpoints can serve as bookmarks for control or analysis. Finally, we present a performance analysis along with two forms of validation to test whether the extracted viewpoints are representative of the viewer's original observations and to test the overall effectiveness of the presented replay methods.
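The content-dependent metric itself is not given in the abstract; as a stand-in, view similarity is often measured by how much scene content two viewpoints share. A minimal sketch in which each viewpoint is reduced to the set of scene-element IDs it sees (the Jaccard measure, the threshold, and the greedy grouping are all illustrative assumptions, not the paper's method):

```python
def viewpoint_similarity(visible_a, visible_b):
    """Jaccard overlap of the scene-element sets seen from two viewpoints."""
    union = len(visible_a | visible_b)
    return len(visible_a & visible_b) / union if union else 0.0

def group_viewpoints(path_visibility, threshold=0.5):
    """Greedily group consecutive viewpoints along the path whose views
    overlap the group's first viewpoint by at least `threshold`."""
    groups, current = [], [0]
    for i in range(1, len(path_visibility)):
        if viewpoint_similarity(path_visibility[current[0]],
                                path_visibility[i]) >= threshold:
            current.append(i)
        else:
            groups.append(current)
            current = [i]
    groups.append(current)
    return groups
```

Each resulting group could then be represented by a single encapsulating viewpoint when synthesizing the replay path.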
992.
An important component of a spoken term detection (STD) system involves estimating confidence measures of hypothesised detections. A potential problem of the widely used lattice-based confidence estimation, however, is that the confidence scores are treated uniformly for all search terms, regardless of how much they may differ in terms of phonetic or linguistic properties. This problem is particularly evident for out-of-vocabulary (OOV) terms, which tend to exhibit high intra-term diversity. To address the impact of term diversity on confidence measures, we propose in this work a term-dependent normalisation technique which compensates for term diversity in confidence estimation. We first derive an evaluation-metric-oriented normalisation that optimises the evaluation metric by compensating for the diverse occurrence rates among terms, and then propose a linear bias compensation and a discriminative compensation to deal with the bias problem that is inherent in lattice-based confidence measurement and from which the Term Specific Threshold (TST) approach suffers. We tested the proposed technique on speech data from the multi-party meeting domain with two state-of-the-art STD systems based on phonemes and words respectively. The experimental results demonstrate that the confidence normalisation approach leads to a significant performance improvement in STD, particularly for OOV terms with phoneme-based systems.
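As a toy illustration of the idea of term-dependent normalisation (the paper's evaluation-metric-oriented, linear-bias, and discriminative variants are more involved and are not reproduced here), one can rescale each term's lattice confidences by a per-term statistic so that a single global decision threshold becomes comparable across terms:

```python
def normalise_term_scores(detections):
    """detections: {term: [raw lattice confidence scores]}.  Rescale each
    term's scores by that term's total score mass so one global decision
    threshold becomes comparable across terms.  A hypothetical stand-in
    for the paper's normalisation formulas, not the actual method."""
    normalised = {}
    for term, scores in detections.items():
        total = sum(scores)
        normalised[term] = [s / total if total > 0 else 0.0 for s in scores]
    return normalised
```

After normalisation, a frequent term with many moderate hits and a rare OOV term with one strong hit are judged on a comparable scale.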
993.

A large amount of data is generated every second around the world, and technology evolves each day to handle and store such enormous data efficiently. Even with these leaps in technology, providing storage space for the gigantic volume of data generated globally poses a conundrum. One of the main problems that storage servers and data centers face is data redundancy: the same data, or slightly modified data, is stored in the same storage server repeatedly, so multiple copies of the same data occupy extra space. Data deduplication overcomes this issue, but existing deduplication techniques are inefficient at identifying duplicate data. Our proposed method uses a mixed-mode analytical architecture to address data deduplication, introducing three levels of mapping. Each level deals with different aspects of the data and of the operations carried out to get unique sets of data into the cloud server. The focus is on effectively solving deduplication to rule out duplicated data and optimize data storage in the cloud server.
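The three-level mapping is described only at a high level; the core decision made at any such level can be sketched as a content-hash index that stores a chunk of data only when its digest has not been seen before (a minimal single-level sketch with assumed names, not the proposed architecture):

```python
import hashlib

def deduplicate(chunks):
    """Store each byte chunk only once, keyed by its SHA-256 digest.
    Returns (index, stored): index maps digest -> position in stored,
    so duplicate chunks resolve to the already-stored copy."""
    index, stored = {}, []
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in index:
            index[digest] = len(stored)
            stored.append(chunk)
    return index, stored
```

A real deployment would additionally handle chunking policy and near-duplicate (slightly modified) data, which a plain hash index cannot detect.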
994.
The user experience (UX) is a critical issue in both product design and the service sector. Ensuring that customers can easily understand a product or service, and thus ensuring higher satisfaction with the customer–product (or service) interaction process, is essential for survival in competitive industries. However, few studies have attempted to integrate all relevant factors into a comprehensive model of UX. The present study addresses this need by proposing a conceptual UX interaction model that incorporates the characteristics of usability and user interaction level based on quality of experience. A method to apply the model and the steps to implement the data collection and analysis are also proposed. The UX interaction model requires a systematic approach to: (1) decompose the product or service characteristics during a UX interaction process; (2) determine a typology of UX items for each characteristic; and (3) select appropriate and feasible strategies to improve these UX items. A quantitative survey of mobile phone users was developed to investigate differences among types of UX and product characteristics. This study provides valuable empirical evidence on the UX interaction model for industries where superior-quality service experiences are to be achieved.
995.
We present an algorithm for computing families of geodesic curves over an open mesh patch to partition the patch into strip-like segments. Specifically, the segments can be well approximated using strips obtained by trimming long, rectangular pieces of material having a prescribed width δ. We call this the width-bounded geodesic strip tiling of a curved surface, a problem with practical applications such as the surfacing of curved roofs. The strips are said to be straight since they are constrained to fit within rectangles of width δ, in contrast to arbitrary, possibly highly curved, strip segments. The straightness criterion, as well as the bound on strip widths, distinguishes our problem from ones previously studied for developable surface decomposition.
996.
This paper presents a constrained Self-adaptive Differential Evolution (SaDE) algorithm for the design of robust optimal fixed-structure controllers under uncertainties and disturbance. Almost all real-world optimization problems have constraints which should be satisfied along with the best optimal solution. In evolutionary algorithms (EAs) the presence of constraints reduces the feasible region and complicates the search process, so a suitable constraint-handling method must also be employed. In the SaDE algorithm, four mutation strategies and the control parameter CR are self-adapted, and a Self-adaptive Penalty (SP) method is introduced for constraint handling. The performance of the SaDE algorithm is demonstrated on the design of robust optimal fixed-structure controllers for three systems: a linearized magnetic levitation system, an F-8 aircraft linearized model, and a SISO plant. For comparison, reported results of a constrained PSO algorithm and of five DE algorithms with different strategies and parameter values are taken into account, using statistical performance over 20 independent runs. From the obtained results, it is observed that the SaDE algorithm is able to self-adapt the mutation strategy and the crossover rate and hence performs better than the other DE variants and the constrained PSO algorithm. The better performance of SaDE is achieved by sustained maintenance of diversity throughout the evolutionary process, producing better individuals consistently; this also helps the algorithm escape local optima, avoiding premature convergence.
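The interplay of self-adapted control parameters and penalty-based constraint handling can be sketched compactly. In the toy version below, each individual carries its own crossover rate CR whose successful values survive, and constraint violation is penalised with a fixed weight; this is a hypothetical simplification, since the paper also self-adapts among four mutation strategies and uses a self-adaptive (not fixed-weight) penalty:

```python
import random

def sade_sketch(fitness, constraint, dim=2, pop_size=20, gens=100):
    """Minimal self-adaptive DE sketch with penalty constraint handling.
    `constraint(x) <= 0` is the feasibility condition."""
    def penalised(x):
        # fixed-weight penalty on constraint violation (simplified SP)
        return fitness(x) + 1e3 * max(0.0, constraint(x))

    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    crs = [random.uniform(0.1, 0.9) for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            # occasionally resample CR; values that produce winners survive
            cr = crs[i] if random.random() < 0.9 else random.uniform(0.1, 0.9)
            trial = [pop[a][d] + 0.5 * (pop[b][d] - pop[c][d])
                     if random.random() < cr else pop[i][d]
                     for d in range(dim)]
            if penalised(trial) <= penalised(pop[i]):
                pop[i], crs[i] = trial, cr
    return min(pop, key=penalised)
```

For controller design, `fitness` would be a closed-loop performance index and `constraint` a stability or robustness margin, both evaluated by simulation.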
997.
Drawing on social exchange theory, this study proposes a model by postulating critical antecedents and mediators as the key drivers of online learning ability. In the model, online learning ability is affected indirectly by trust via 3 mediators simultaneously, including team commitment, task conflict, and relationship conflict, whereas trust is impacted directly by expressiveness interdependence, outcome interdependence, and task interdependence. Empirical testing of this model, by investigating the personnel of virtual teams from information technology organizations, confirms the applicability of social exchange theory in understanding online learning ability. This study contributes to the virtual team learning literature by extending social exchange theory to the rarely explored area of online learning ability of organizational teams and validating idiosyncratic drivers of online learning ability. Last, this article provides managerial implications and limitations of the research.
998.
Using a patented defect-avoidance technique, high-yield production of high-density SRAM devices (ULSI SRAMs) can be achieved one process generation ahead of the rest of the industry. Production wafer yields as high as 100% and long-term average yields above 80% are reported on Inova's monolithic, 1.2 μm, 320 square mm, one-megabit SRAM, demonstrating a practical method of achieving wafer-scale integration. A yield model is presented and used to determine the optimized architecture and redundancy scheme for Inova's four-megabit SRAM and to predict yield as a function of defect density. Achievement of a working 8M-bit experimental device using a 1.2 μm process is also reported.
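The abstract's yield model is not spelled out; a common starting point is a Poisson model in which defects land on a die at rate λ = A·D, and redundancy lets a die survive a limited number of them. A sketch under that assumption (the function name, parameters, and the optimistic premise that every defect is repairable are illustrative, not the paper's model):

```python
import math

def die_yield(area_mm2, defect_density_per_mm2, repairable_defects=0):
    """Poisson yield sketch: probability that a die of the given area
    collects at most `repairable_defects` defects, i.e. that redundancy
    can repair it.  With no redundancy this reduces to Y = exp(-A*D)."""
    lam = area_mm2 * defect_density_per_mm2   # expected defects per die
    return sum(math.exp(-lam) * lam ** k / math.factorial(k)
               for k in range(repairable_defects + 1))
```

Sweeping `repairable_defects` against die area is the kind of trade-off such a model lets one optimize when choosing a redundancy scheme.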
999.
1000.
A method was developed to determine kernel moisture content (KMC) and aflatoxin concentration in discrete peanut samples. Shelled peanuts were weighed to the nearest 0.01 g, and a water slurry was made by blending the peanuts for 2 min with 2.2 ml of water per g of peanuts. The slurry (10 g) was withdrawn and dried at 130°C for 3 h to determine KMC. Methanol was added to the remaining slurry and blended for an additional 1 min, and aflatoxins were quantitated with high-performance liquid chromatography. Comparison of the slurry method with an official peanut moisture method showed good agreement between the two over a range of moisture levels. Recovery of aflatoxin B1 from spiked samples averaged 97% with an average coefficient of variation of 3.6%. The method enables determination of both KMC and aflatoxin content in peanut samples without degradation of aflatoxin that would occur when using the official moisture method.
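The moisture arithmetic implied by the slurry preparation can be written out explicitly. Assuming 1 ml of added water weighs 1 g, each gram of peanuts yields 1 + 2.2 = 3.2 g of slurry, and only the peanut solids survive drying at 130°C; the helper below is a hypothetical sketch of that back-calculation, not the method's official formula:

```python
def kernel_moisture_content(aliquot_wet_g, aliquot_dry_g, water_per_g=2.2):
    """Back out kernel moisture content (wet basis) from the dried slurry
    aliquot.  Each gram of peanuts with moisture fraction m contributes
    (1 - m) g of solids to (1 + water_per_g) g of slurry, so
    dry/wet = (1 - m) / (1 + water_per_g)."""
    dilution = 1.0 + water_per_g
    return 1.0 - dilution * (aliquot_dry_g / aliquot_wet_g)
```

For example, peanuts at 8% moisture would leave 0.92/3.2 ≈ 0.2875 of a 10 g slurry aliquot as dry matter.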